Progressive Unsupervised Person Re-Identification by Tracklet Association With Spatio-Temporal Regularization


Abstract

Existing methods for person re-identification (Re-ID) are mostly based on supervised learning, which requires numerous manually labeled samples across all camera views for training. Such a paradigm suffers from a scalability issue, since in real-world Re-ID applications it is difficult to exhaustively label abundant identities over multiple disjoint views. To this end, we propose a progressive deep learning method for unsupervised Re-ID in the wild by Tracklet Association with Spatio-Temporal Regularization (TASTR). In our approach, we first collect tracklet data within each camera by automatic detection and tracking. Then, an initial model is trained with within-camera triplet construction for representation learning. After that, using the visual features and a spatio-temporal constraint, we associate cross-camera tracklets to generate cross-camera triplets and update the model. Lastly, with the refined model, better features of tracklets can be extracted, which further promotes the association of cross-camera tracklets. The last two steps are iterated multiple times to progressively upgrade the model. To facilitate the study, we have collected a new 4K UHD video dataset named Campus4K with full frames and spatio-temporal information. Experimental results show that with the spatio-temporal constraint in the training phase, the proposed approach outperforms the state of the art by notable margins on DukeMTMC-reID, and achieves performance competitive with fully supervised methods on both the DukeMTMC-reID and Campus4K datasets.
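The pipeline in the abstract (within-camera bootstrapping, spatio-temporally gated cross-camera association, then iterated refinement) can be sketched roughly as below. This is a toy illustration, not the authors' TASTR implementation: the feature extractor, the fixed time-gap gate, the distance threshold, and the noise-decay stand-in for triplet fine-tuning are all assumptions made for the sketch.

```python
# Toy sketch of a progressive tracklet-association loop. All function names,
# thresholds, and data shapes are illustrative assumptions, not the paper's code.
import random

random.seed(0)

def extract_feature(tracklet, model):
    # Stand-in for a CNN embedding: a noisy copy of the tracklet's latent identity.
    return [v + random.gauss(0, model["noise"]) for v in tracklet["latent"]]

def st_compatible(t1, t2, max_gap=50.0):
    # Spatio-temporal gate: reject pairs whose cross-camera time gap is
    # implausibly large (assumed constraint; the paper regularizes association
    # with spatio-temporal statistics rather than a fixed gap).
    return abs(t1["time"] - t2["time"]) <= max_gap

def distance(f1, f2):
    return sum((a - b) ** 2 for a, b in zip(f1, f2)) ** 0.5

def associate(tracklets, model, thresh=1.0):
    # Match cross-camera tracklet pairs that pass the spatio-temporal gate
    # and are close in feature space.
    pairs = []
    for t1 in tracklets:
        for t2 in tracklets:
            if t1["cam"] < t2["cam"] and st_compatible(t1, t2):
                d = distance(extract_feature(t1, model), extract_feature(t2, model))
                if d < thresh:
                    pairs.append((t1, t2))
    return pairs

def update_model(model, pairs):
    # Placeholder for triplet-loss fine-tuning on the associated pairs:
    # pretend each round sharpens the embedding (reduces feature noise).
    model["noise"] *= 0.5
    return model

# Toy data: two identities seen by two cameras.
tracklets = [
    {"cam": 0, "time": 10.0, "latent": [1.0, 0.0]},
    {"cam": 1, "time": 30.0, "latent": [1.0, 0.0]},
    {"cam": 0, "time": 12.0, "latent": [0.0, 1.0]},
    {"cam": 1, "time": 400.0, "latent": [0.0, 1.0]},  # fails the S-T gate
]

model = {"noise": 0.3}
for step in range(3):  # the paper iterates association and model update
    pairs = associate(tracklets, model)
    model = update_model(model, pairs)

print(len(pairs))  # only the spatio-temporally plausible same-identity pair survives
```

The key design point mirrored here is that the spatio-temporal gate prunes visually similar but physically implausible matches before they can pollute the pseudo-labels used to update the model.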


Similar Articles

Person re-identification by unsupervised video matching

Most existing person re-identification (ReID) methods rely only on the spatial appearance information from either one or multiple person images, whilst ignoring the space-time cues readily available in video or image-sequence data. Moreover, they often assume the availability of exhaustively labelled cross-view pairwise data for every camera pair, making them non-scalable to ReID applications in r...


Part-based spatio-temporal model for multi-person re-identification

In this paper we propose an adaptive part-based spatio-temporal model that characterizes a person's appearance using color and facial features. Face image selection based on low-level cues is used to select usable face images...


Person Re-identification by Unsupervised ℓ1 Graph Learning

Most existing person re-identification (Re-ID) methods are based on supervised learning of a discriminative distance metric. They thus require a large amount of labelled training image pairs which severely limits their scalability. In this work, we propose a novel unsupervised Re-ID approach which requires no labelled training data yet is able to capture discriminative information for cross-vie...


Unsupervised Cross-dataset Person Re-identification by Transfer Learning of Spatial-Temporal Patterns

Most of the proposed person re-identification algorithms conduct supervised training and testing on single labeled datasets with small size, so directly deploying these trained models to a large-scale real-world camera network may lead to poor performance due to underfitting. It is challenging to incrementally optimize the models by using the abundant unlabeled data collected from the target do...


Temporal Model Adaptation for Person Re-identification

Person re-identification is an open and challenging problem in computer vision. The majority of efforts have been spent either on designing the best feature representation or on learning the optimal matching metric. Most approaches have neglected the problem of adapting the selected features or the learned model over time. To address such a problem, we propose a temporal model adaptation scheme with ...



Journal

Journal: IEEE Transactions on Multimedia

Year: 2021

ISSN: 1520-9210, 1941-0077

DOI: https://doi.org/10.1109/tmm.2020.2985525